24 research outputs found

    Classification et Caractérisation de l'Expression Corporelle des Emotions dans des Actions Quotidiennes

    The work conducted in this thesis can be summarized into four main steps. Firstly, we proposed a multi-level body movement notation system that allows the description of expressive body movement across various body actions. Secondly, we collected a new database of emotional body expression in daily actions. This database constitutes a large repository of bodily expression of emotions, covering the expression of 8 emotions in 7 actions, combining video and motion capture recordings, and resulting in more than 8000 sequences of expressive behaviors. Thirdly, we explored the classification of emotions based on our multi-level body movement notation system, using a Random Forest approach. The advantage of the Random Forest approach in our work is twofold: 1) the reliability of the classification model, and 2) the possibility of selecting a subset of relevant features based on their relevance measures. We also compared the automatic classification of emotions with human perception of emotions expressed in different actions. Finally, we extracted the most relevant features that capture the expressive content of the motion, based on the feature relevance measures returned by the Random Forest model. We used this subset of features to explore the characterization of emotional body expression across different actions, using a Decision Tree model for this purpose.
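    The classification-and-selection pipeline described above can be illustrated with a short sketch. This is not the thesis implementation: the descriptor matrix, labels, and hyperparameters below are placeholders, chosen only to show how a Random Forest yields both a classifier and per-feature relevance measures that can then feed a Decision Tree.

```python
# Minimal sketch (placeholder data, not the thesis code): Random Forest
# classification of 8 emotions from movement descriptors, importance-based
# feature selection, and a Decision Tree on the selected features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(8000, 40))    # hypothetical descriptors, one row per sequence
y = rng.integers(0, 8, size=8000)  # 8 emotion labels (placeholder)

# 1) Reliable classifier that also exposes per-feature relevance measures.
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
print("Random Forest CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())

# 2) Keep only the most relevant descriptors.
top = np.argsort(forest.feature_importances_)[::-1][:10]

# 3) Interpretable characterization on the reduced feature set.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X[:, top], y)
print("Decision Tree accuracy on selected features:", tree.score(X[:, top], y))
```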

    Vers des Agents Conversationnels Animés dotés d'émotions et d'attitudes sociales

    In this article, we propose an architecture for a socio-affective Embodied Conversational Agent (ECA). The different computational models of the architecture enable an ECA to express emotions and social attitudes during an interaction with a user. Based on corpora of actors expressing emotions, models have been defined to compute the emotional facial expressions of an ECA and the characteristics of its body movements. A user-perception approach has been used to design models that define how an ECA should adapt its non-verbal behavior according to the social attitude it wants to display and to the behavior of its interlocutor. The emotions and social attitudes to express are computed by cognitive models presented in this article.

    Perception of Emotions and Body Movement in the Emilya Database

    In this paper, we examine the perception of emotions as well as the characterization and classification of emotional body expressions based on perceptual ratings of body cues. Emilya (EMotional body expression In daILY Actions), a database of body expressions of 8 emotions (including Neutral) in 7 daily actions performed by 11 actors, is used for these purposes. A perceptual study is conducted to explore four issues: 1) how expressed emotions are perceived by humans, 2) how emotion recognition by humans differs across daily actions, 3) how expressed emotions are characterized by humans through body cues, and 4) how emotions are automatically classified based on human ratings of body cues. Across all the actions, most of the expressed emotions were correctly identified, but some were confused (e.g. Shame and Sadness). Confusions at the level of emotion perception may be due to a lack of contextual factors (Emilya contains body movements of daily actions without reference to a context), to a similarity of bodily expressions, but also to the absence of other modalities that could contribute to a better recognition of bodily expression of these emotions (e.g. facial expressions). In the paper, we detail and discuss the results of these different studies.
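    As a rough illustration of issue 4 above (automatic classification from human ratings of body cues) together with the confusion analysis, the sketch below trains a classifier on placeholder ratings and prints a confusion matrix. The number of cues, the data, and the classifier choice are assumptions, not the protocol used in the paper.

```python
# Illustrative only: classify emotions from perceptual body-cue ratings and
# inspect which emotions get confused. All data below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
ratings = rng.uniform(0, 7, size=(2000, 12))  # hypothetical per-clip body-cue ratings
labels = rng.integers(0, 8, size=2000)        # 8 emotions (including Neutral), placeholder

X_tr, X_te, y_tr, y_te = train_test_split(ratings, labels, test_size=0.3, random_state=1)
clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)

# Rows: expressed emotion, columns: predicted emotion; off-diagonal mass marks
# confusions analogous to the Shame/Sadness confusion reported in the paper.
print(confusion_matrix(y_te, clf.predict(X_te)))
```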

    Dynamic stimuli visualization for experimental studies of body language

    Understanding human body behavior has relied on perceptive studies. Recently, several experimental studies have been conducted with virtual avatars that reproduce human body movements. Visualizing human body behavior stimuli through avatars may introduce bias into the comprehension of human perception. Indeed, the choice of the virtual camera trajectory and orientation affects the display of the stimuli. In this paper, we propose control functions for the virtual camera.
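    To make the idea concrete, here is a hypothetical sketch of one possible camera control function: it keeps the camera on a fixed-radius orbit around the avatar's root and always oriented toward it, so the framing of the stimulus stays comparable across clips. The function name and parameters are illustrative assumptions, not the controls proposed in the paper.

```python
# Hypothetical virtual-camera control: fixed distance and height from the
# avatar's root, always looking at it. Not the paper's actual functions.
import numpy as np

def orbit_camera(root_pos, azimuth_deg, distance=3.0, height=1.5):
    """Return (camera_position, look_at_target) for a given viewing angle."""
    a = np.radians(azimuth_deg)
    offset = np.array([distance * np.cos(a), distance * np.sin(a), height])
    cam_pos = np.asarray(root_pos, dtype=float) + offset
    target = np.asarray(root_pos, dtype=float) + np.array([0.0, 0.0, 0.7 * height])
    return cam_pos, target

# Example: frame an avatar standing at the origin, viewed from a 45-degree azimuth.
pos, target = orbit_camera([0.0, 0.0, 0.0], azimuth_deg=45)
print(pos, target)
```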

    Multi-level classification of emotional body expression


    Collection and characterization of emotional body behaviors

    This paper addresses two issues in modeling bodily expression of emotions: the collection of emotional behaviors and the characterization of expressive movement. We describe our body movement coding schema, intended for the characterization of bodily emotional expression in different movement tasks. We also describe the database that we use for the characterization of emotion expression in different movement tasks through the proposed body movement coding schema.
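    Purely as an illustration, the sketch below shows one possible shape for a multi-level annotation record, with a sequence-level layer and a body-part-level layer. The level names and cue fields are assumptions and do not reproduce the coding schema defined in the paper.

```python
# Hypothetical multi-level annotation record; field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class BodyPartAnnotation:
    part: str         # e.g. "head", "torso", "arms"
    amplitude: float  # spatial extent of the movement
    speed: float      # temporal dynamics
    tension: float    # perceived muscular tension

@dataclass
class SequenceAnnotation:
    action: str   # e.g. "walking", "knocking at the door"
    emotion: str  # expressed emotion label
    whole_body_cues: dict = field(default_factory=dict)  # global level
    part_cues: list = field(default_factory=list)        # body-part level

seq = SequenceAnnotation(
    action="walking",
    emotion="Sadness",
    whole_body_cues={"posture_contraction": 0.8, "overall_energy": 0.2},
    part_cues=[BodyPartAnnotation("head", amplitude=0.3, speed=0.2, tension=0.1)],
)
print(seq.emotion, len(seq.part_cues))
```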

    Head, Shoulders and Hips Behaviors during Turning

    Turning behavior is part of the basic library of motor synergies. It involves a complex interplay between the different body parts. In this study, we investigate the behavior of the shoulders, hips, and head during walking and turning tasks with various emotional states and different turning angles. We found that the shoulders and hips follow a strong linear relationship during turning, across different angles and walking styles, while the head behavior is affected by these variables.
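    The reported linear coupling can be illustrated with a minimal sketch that fits a line between hip and shoulder yaw angles over a turn. The synthetic angles and the chosen slope are placeholders, not measurements from the study.

```python
# Minimal sketch: quantify how linearly shoulder yaw follows hip yaw during a
# turn. Angles are synthetic placeholders, not data from the study.
import numpy as np

hip_yaw = np.linspace(0.0, 90.0, 200)  # degrees over a 90-degree turn
shoulder_yaw = 0.95 * hip_yaw + np.random.default_rng(2).normal(0.0, 2.0, 200)

slope, intercept = np.polyfit(hip_yaw, shoulder_yaw, 1)
r = np.corrcoef(hip_yaw, shoulder_yaw)[0, 1]
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.3f}")  # r near 1 => strong linear coupling
```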
